New Criteria for Comparing Global Stochastic Derivative-Free Optimization Algorithms
Authors
Abstract
Similar resources
Derivative Free Trust Region Algorithms for Stochastic Optimization
In this article we study the following stochastic optimization problem. Let (Ω, F, P) be a probability space. Let ζ(ω) (where ω denotes a generic element of Ω) be a random variable on (Ω, F, P), taking values in the probability space (Ξ, G, Q), where Q denotes the probability measure of ζ. Suppose for some open set E ⊂ R, F : E × Ξ → R is a real-valued function such that for each x ∈ E, F(x, ·) : ...
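The setup above corresponds to the standard simulation-based objective in stochastic derivative-free optimization; a plausible reading of the truncated description (an assumption inferred from the notation, not a quotation from the paper) is the minimization of the expected value of F under Q,

    minimize f(x) over x ∈ E,  where  f(x) = ∫_Ξ F(x, ζ) dQ(ζ),

with the trust-region method having access only to noisy sample values F(x, ζ_i), not to derivatives of f.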
Benchmarking Derivative-Free Optimization Algorithms
We propose data profiles as a tool for analyzing the performance of derivative-free optimization solvers when there are constraints on the computational budget. We use performance and data profiles, together with a convergence test that measures the decrease in function value, to analyze the performance of three solvers on sets of smooth, noisy, and piecewise-smooth problems. Our results provide...
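The convergence test and data-profile construction described above can be summarized in a short sketch. The function names (solved_after, data_profile) and the budget normalization by (n_p + 1) evaluations are illustrative assumptions; the test itself checks whether a solver has achieved a given fraction of the best decrease in function value obtained by any solver on that problem.

```python
import numpy as np

def solved_after(history, f_best, tau):
    """Index of the first evaluation at which the decrease test
    f(x0) - f(x_k) >= (1 - tau) * (f(x0) - f_best) holds, else None.
    `history` lists objective values in evaluation order; `f_best` is the
    best value obtained by any solver on this problem within the budget."""
    f0 = history[0]
    target = f0 - (1.0 - tau) * (f0 - f_best)
    for k, fk in enumerate(history):
        if fk <= target:
            return k + 1  # number of function evaluations used
    return None

def data_profile(eval_counts, dims, alphas):
    """Fraction of problems solved within alpha * (n_p + 1) evaluations,
    where n_p is the problem dimension (a simplified budget convention)."""
    counts = np.array([np.inf if c is None else c for c in eval_counts])
    dims = np.asarray(dims)
    return [float(np.mean(counts <= a * (dims + 1))) for a in alphas]
```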
Experimental Comparisons of Derivative Free Optimization Algorithms
In this paper, the performances of the quasi-Newton BFGS algorithm, the NEWUOA derivative-free optimizer, the Covariance Matrix Adaptation Evolution Strategy (CMA-ES), the Differential Evolution (DE) algorithm, and Particle Swarm Optimizers (PSO) are compared experimentally on benchmark functions reflecting important challenges encountered in real-world optimization problems. Dependence of the...
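A minimal version of such an experimental comparison can be set up by giving several derivative-free solvers the same function-evaluation budget on a benchmark problem and recording the best value reached. The sketch below uses SciPy's Nelder-Mead, Powell, and differential evolution solvers on the Rosenbrock function purely as an illustration; the paper's actual study covers BFGS, NEWUOA, CMA-ES, DE, and PSO, some of which would require additional packages.

```python
import numpy as np
from scipy.optimize import minimize, differential_evolution, rosen

budget = 2000                      # shared budget of function evaluations
dim = 10
x0 = np.full(dim, 3.0)             # common starting point
bounds = [(-5.0, 10.0)] * dim

results = {}
# Local derivative-free solvers under a maxfev cap.
for method in ("Nelder-Mead", "Powell"):
    res = minimize(rosen, x0, method=method, options={"maxfev": budget})
    results[method] = (res.fun, res.nfev)

# Differential evolution; generations chosen to roughly match the budget.
res = differential_evolution(rosen, bounds, maxiter=budget // (15 * dim),
                             popsize=15, seed=0, polish=False)
results["DE"] = (res.fun, res.nfev)

for name, (fval, nfev) in results.items():
    print(f"{name:12s}  best f = {fval:.3e}  evaluations = {nfev}")
```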
An Accelerated Method for Derivative-Free Smooth Stochastic Convex Optimization
We consider an unconstrained problem of minimizing a smooth convex function which is only available through noisy observations of its values, the noise consisting of two parts. Similar to stochastic optimization problems, the first part is of a stochastic nature. In contrast, the second part is additive noise of an unknown nature, but bounded in absolute value. In the two-point ...
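The "two-point" construction referred to above is a randomized finite-difference gradient estimate built from two function values per iteration. The sketch below shows the generic estimator on a toy quadratic; the smoothing convention, step size, and function names are assumptions for illustration and do not reproduce the accelerated method of the paper.

```python
import numpy as np

def two_point_gradient(f, x, tau, rng):
    """Randomized two-point gradient estimate from two value queries:
    g = d/(2*tau) * (f(x + tau*e) - f(x - tau*e)) * e,
    with e a random direction on the unit sphere (illustrative convention)."""
    d = x.size
    e = rng.standard_normal(d)
    e /= np.linalg.norm(e)                      # random unit direction
    return d / (2.0 * tau) * (f(x + tau * e) - f(x - tau * e)) * e

# Plain gradient-free descent on a smooth convex test function; the step
# size and iteration count are ad-hoc choices for this toy example.
rng = np.random.default_rng(0)
f = lambda x: float(np.dot(x, x))
x = np.ones(20)
for _ in range(3000):
    x -= 0.01 * two_point_gradient(f, x, tau=1e-3, rng=rng)
print(f(x))                                     # should be close to 0
```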
Student's t Distribution based Estimation of Distribution Algorithms for Derivative-free Global Optimization
In this paper, we are concerned with a branch of evolutionary algorithms termed estimation of distribution algorithms (EDAs), which have been successfully used to tackle derivative-free global optimization problems. For existing EDAs, it is a common practice to use a Gaussian distribution or a mixture of Gaussian components to represent the statistical property of available promising solutions fou...
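The common Gaussian-based EDA practice mentioned above follows a simple loop: sample a population from the search model, keep an elite fraction, refit the model to the elite set, and repeat. The sketch below implements that plain-Gaussian loop; the paper's contribution is to replace the Gaussian with a heavier-tailed Student's t model, which is not reproduced here. Function and parameter names are illustrative.

```python
import numpy as np

def gaussian_eda(f, dim, pop_size=100, elite_frac=0.3, iters=200, seed=0):
    """Minimal estimation-of-distribution loop with a Gaussian search model:
    sample, select the best fraction, refit mean/covariance, repeat."""
    rng = np.random.default_rng(seed)
    mean, cov = np.zeros(dim), np.eye(dim) * 10.0
    n_elite = max(2, int(elite_frac * pop_size))
    best_x, best_f = None, np.inf
    for _ in range(iters):
        pop = rng.multivariate_normal(mean, cov, size=pop_size)
        vals = np.apply_along_axis(f, 1, pop)
        order = np.argsort(vals)
        if vals[order[0]] < best_f:
            best_f, best_x = vals[order[0]], pop[order[0]]
        elite = pop[order[:n_elite]]
        mean = elite.mean(axis=0)
        cov = np.cov(elite, rowvar=False) + 1e-8 * np.eye(dim)  # jitter
    return best_x, best_f

# Example: a shifted sphere function as a stand-in for a global test problem.
x_star, f_star = gaussian_eda(lambda x: np.sum((x - 3.0) ** 2), dim=5)
print(f_star)
```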
Journal
Journal title: International Journal of Advanced Computer Science and Applications
Year: 2019
ISSN: 2156-5570, 2158-107X
DOI: 10.14569/ijacsa.2019.0100781